Maximal Associated Regression: A Nonlinear Extension to Least Angle Regression


Abstract

This paper proposes Maximal Associated Regression (MAR), a novel algorithm that performs forward stage-wise regression by applying nonlinear transformations to fit predictor covariates. For each predictor, MAR selects between a linear or an additive fit as determined by the dataset. The proposed algorithm is an adaptation of Least Angle Regression (LARS) and retains its efficiency in building sparse models. Constrained penalized splines are used to generate smooth curves for additive fits. A monotonically constrained extension (MARm) is also introduced in this paper to fit isotonic regression problems. The proposed algorithms are validated on both synthetic and real datasets. The performances of MAR and MARm are compared against LARS, Generalized Linear Models (GLM), and Generalized Additive Models (GAM) under a Gaussian assumption with a unity link function. Results indicate that MAR-type algorithms achieve superior subset selection accuracy, generating sparser models that generalize well to new data. MAR and MARm are also able to fit sample-deficient datasets. Thus, the proposed algorithms are a valuable tool for data exploration, especially when a priori knowledge of the dataset is unavailable.
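The forward stage-wise selection that MAR builds on can be sketched as a toy example: at each step, nudge the coefficient of the predictor most associated (most correlated) with the current residual. This is an illustrative NumPy sketch of the generic stage-wise idea only; the function name `forward_stagewise`, the step size `eps`, and the purely linear setup are assumptions, not the paper's MAR implementation (which also considers nonlinear spline transforms of each predictor).

```python
import numpy as np

def forward_stagewise(X, y, n_steps=500, eps=0.01):
    """Toy forward stage-wise regression (LARS-family idea):
    repeatedly move the coefficient of the predictor most
    correlated with the current residual by a small step.
    Illustrative only -- not the MAR algorithm from the paper."""
    n, p = X.shape
    beta = np.zeros(p)
    residual = y.copy()
    for _ in range(n_steps):
        corr = X.T @ residual            # association of each predictor with residual
        j = np.argmax(np.abs(corr))      # maximally associated predictor
        beta[j] += eps * np.sign(corr[j])
        residual = y - X @ beta
    return beta

rng = np.random.default_rng(0)
X = rng.standard_normal((200, 5))
X /= np.linalg.norm(X, axis=0)           # unit-norm columns, as in LARS
y = 3 * X[:, 0] - 2 * X[:, 2] + 0.1 * rng.standard_normal(200)
beta = forward_stagewise(X, y)
```

With near-orthogonal columns, the procedure concentrates weight on the two truly active predictors, producing the sparse solution path that makes LARS-style algorithms attractive for subset selection.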


Related Resources

Least Angle Regression

The purpose of model selection algorithms such as All Subsets, Forward Selection, and Backward Elimination is to choose a linear model on the basis of the same set of data to which the model will be applied. Typically we have available a large collection of possible covariates from which we hope to select a parsimonious set for the efficient prediction of a response variable. Least Angle Regres...
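As a concrete illustration of the parsimonious selection described above, LARS can be run directly via scikit-learn's `Lars` estimator. This is a hedged usage sketch (it assumes scikit-learn is installed; the synthetic data and the `n_nonzero_coefs=2` budget are choices made for illustration, not from the abstract):

```python
import numpy as np
from sklearn.linear_model import Lars  # least angle regression

rng = np.random.default_rng(1)
X = rng.standard_normal((100, 8))
# Response depends on only two of the eight candidate covariates.
y = 2 * X[:, 1] - X[:, 4] + 0.05 * rng.standard_normal(100)

# Ask LARS to stop once two predictors have entered the model.
model = Lars(n_nonzero_coefs=2).fit(X, y)
```

After fitting, `model.coef_` is sparse: only the two truly active covariates receive nonzero coefficients, which is exactly the parsimonious subset selection behavior the abstract describes.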


Extension of Logic regression to Longitudinal data: Transition Logic Regression

Logic regression is a generalized regression and classification method that is able to form Boolean combinations of the original binary variables as new predictive variables. Logic regression was introduced for case-control or cohort studies with independent observations. Although correlated observations occur in various studies for different reasons, logic regression has not been studi...


Discussion of “ Least Angle Regression ”

Being able to reliably, and automatically, select variables in linear regression models is a notoriously difficult problem. This research attacks this question head on, introducing not only a computationally efficient algorithm and method, LARS (and its derivatives), but at the same time introducing comprehensive theory explaining the intricate details of the procedure as well as theory to guid...


Robust groupwise least angle regression

Many regression problems exhibit a natural grouping among predictor variables. Examples are groups of dummy variables representing categorical variables, or present and lagged values of time series data. Since model selection in such cases typically aims for selecting groups of variables rather than individual covariates, an extension of the popular least angle regression (LARS) procedure to gr...


Discussion of Least Angle Regression

Algorithms for simultaneous shrinkage and selection in regression and classification provide attractive solutions to knotty old statistical challenges. Nevertheless, as far as we can tell, Tibshirani’s Lasso algorithm has had little impact on statistical practice. Two particular reasons for this may be the relative inefficiency of the original Lasso algorithm, and the relative complexity of mor...



Journal

Journal: IEEE Access

Year: 2021

ISSN: 2169-3536

DOI: https://doi.org/10.1109/access.2021.3131740